Crowdsourcing and Citizen Science

Hello, and welcome to this short talk about Crowdsourcing and Citizen Science. In this session we will discuss what we mean by citizen science or crowdsourcing and when a researcher might want to use these methods. We’ll look at a couple of examples, and also talk about how to manage practicalities such as participant motivation and quality control.

Crowdsourcing and Citizen Science are worth introducing together

The term crowdsourcing was coined in 2006 by Wired magazine author Jeff Howe, as a variant of “outsourcing”. Since then the term has taken on a somewhat broader meaning, spanning a variety of ways to attract participants and divide work among them to achieve a cumulative result. This often involves an open call for participants. Participants may be invited, for example, to complete small tasks online in exchange for payment, or some other reward. However, the term is also sometimes used in a colloquial way to refer to an informal process of gathering ideas from any group.

In research settings, crowdsourcing is often used to process data in an operational way, and contributors are not the subject of the research. However, in some cases, contributors can be formally enrolled into a research project as research subjects, for example, as survey respondents. Doing this requires careful treatment of matters such as informed consent and confidentiality: some specific guidance provided by the UC Berkeley Human Research Protection Program is linked below. Citizen science projects typically recruit participants to join as active hands-on contributors, and not as “research subjects”. Indeed, Alan Irwin — a British sociologist who was one of the first people to use the term citizen science — puts the emphasis on “developing concepts of scientific citizenship\[,\] which foregrounds the necessity of opening up science and science policy processes to the public”. This way of thinking about citizen science goes beyond the hands-on practicalities of gathering and analysing data. Nevertheless, the practicalities are a good place to start.

These are simple forms of “participatory research”

Both crowdsourcing and citizen science projects typically assume that participants will restrict their participation to carrying out discrete bite-sized tasks — for example, classifying texts or images, or gathering samples. Typically, these small pieces of work don’t require significant expertise to complete — if you find yourself designing a project which needs lots of engaged expert contributions, you may need to adopt an entirely different research design. Other aspects of participatory research which are beyond the typical scope of crowdsourcing and citizen science projects are the subject of a separate video in this series.

Benefits: Why might you want to use crowdsourcing and citizen science methods?

Both crowdsourcing and citizen science open up aspects of a research process to participation from people who might not otherwise contribute to research. This can bring a number of potential benefits. For instance, opening up a research process in this way can be inexpensive in comparison with other ways of gathering data. These methods can also yield a broader range of data than can readily be sourced through other means. Your research may benefit considerably from the unexpected observations and innovative thinking that your participants provide. Taking part in a research project can have benefits for participants as well; and, indeed, there may be further benefits to all parties, and to the public at large, arising from public engagement with the research process. Let’s have a look at an example.

An early example

Contemporary crowdsourcing and citizen science are often facilitated by computer technology. However, similar approaches were in use much earlier. One stand-out example was the creation of the Oxford English Dictionary. James Murray, a Victorian lexicographer, was the lead editor of the team responsible for the OED. Faced with the enormous task of producing a comprehensive dictionary, he enlisted the help of dozens of amateur philologists as volunteer researchers to provide quotations illustrating the uses of each meaning of each word, along with evidence for the earliest use of each.

Participant motivation

Training for participants can be crucial in citizen science projects. Firstly, it may be necessary for ensuring the quality of contributions, whether in primary data collection or in other tasks such as analysing data that has already been collected. In the OED example, participants had to understand what kinds of submissions were being sought, what form to put them in, and how to go about sourcing them. Moreover, learning new things may be a big part of what participants feel they have signed up for in the first place. Understanding your participants’ motivations and helping them realise their goals will help your project succeed. In connection with this, some citizen science projects invite participants to take on leadership or coordinating roles, or to help support others’ learning.

A more contemporary example

Let’s take a look at a contemporary citizen science project to get a sense of how they work.

Surfers Against Sewage is a UK charity focused primarily on marine conservation and related topics like beach water quality. Recently, it ran a citizen science project to monitor water quality in inland waters. Participants would fill a couple of plastic bottles with river water each week, and post them off to a firm to test for E. coli and other contaminants. Results were aggregated, showing trends and disparities.

In order for all of this to work, the participants didn’t have to know much about water testing. They were instructed on how to collect good samples and sterilise their equipment. All further technical details were taken care of by the lab. Participants also didn’t need significant resources beyond their time: Surfers Against Sewage sent out the necessary kit, and paid for the tests. Monthly Zoom calls gave participants an opportunity to look at the data collected so far, and to talk about shared concerns (for example, changing regulations around applying for designated Bathing Water status). The project coordinators helped participants navigate obstacles, ranging from problems with the courier to tensions in local groups.

Putting it all together, by the end of the summer, SAS were able to announce the headline-grabbing result: “Citizen science data shows that 60% of the inland bathing sites we monitored didn’t meet minimum safety requirements for water users in England.” While similar data could in principle have been gathered by one person who travelled to forty different riverbank sites around the UK each week, that would have been an exhausting job — furthermore, the benefits for participants would have been missed, and the final news story would have had a very different feel.
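To make the aggregation step concrete, here is a minimal sketch of how weekly per-site results might be rolled up into a headline figure like the one above. It is not SAS’s actual methodology: the example data, the column names and the pass/fail threshold are all illustrative assumptions.

```python
# A minimal sketch (not SAS's actual methodology) of rolling up weekly
# citizen-collected water samples into a headline figure. The data, column
# names and threshold below are illustrative assumptions.
import pandas as pd

# Hypothetical data: one row per sample, with a site name, the sampling week,
# and an E. coli count in colony-forming units per 100 ml.
samples = pd.DataFrame({
    "site":      ["Wolvercote", "Wolvercote", "Port Meadow", "Port Meadow"],
    "week":      ["2022-06-06", "2022-06-13", "2022-06-06", "2022-06-13"],
    "ecoli_cfu": [450, 1200, 300, 350],
})

THRESHOLD = 900  # illustrative pass/fail cut-off, cfu per 100 ml

# In this toy rule, a site "fails" if any weekly sample over the season
# exceeds the threshold.
worst_per_site = samples.groupby("site")["ecoli_cfu"].max()
failing = worst_per_site > THRESHOLD

print(f"{failing.mean():.0%} of monitored sites exceeded the illustrative threshold.")
```

In a real project, the classification rules would come from the relevant regulator rather than a single fixed threshold, but the shape of the calculation (group by site, summarise the season, report the failing share) stays the same.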

Risks and risk management strategies

Along with the various benefits touched on earlier, crowdsourcing and citizen science methods come with some risks and limitations. You may decide that your research can’t be carried out ethically or practically by relying on low-paid or unpaid contributors; for example, confidentiality concerns might make these strategies inappropriate for some research settings. In settings where crowdsourcing or citizen science is suitable, from a pragmatic standpoint, the project will need to be made robust against low-quality contributions, reporting inaccuracies and participant drop-out. Some of these risks can be mitigated with a good research design. For example, one typical strategy that can help ensure data quality in crowdsourcing projects is to test would-be participants on a set of screening questions, in order to assess their suitability as contributors, before they start contributing research data. It may also be necessary to spot-check individual responses, and to screen the contributions for statistical outliers or other indications of less-useful contributions (for example, if your participants have disengaged and started to mindlessly click through the questions).
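As a rough illustration of what that kind of screening might look like in practice, the sketch below flags responses that were completed implausibly fast, failed an embedded attention check, or gave the same answer to every item. The column names and cut-offs are hypothetical and would need to be adapted to your own instrument.

```python
# A minimal sketch of automated quality checks on crowdsourced survey responses.
# All column names, example values and cut-offs are hypothetical.
import pandas as pd

responses = pd.DataFrame({
    "participant":     ["p1", "p2", "p3"],
    "seconds_taken":   [410, 45, 380],                    # total completion time
    "attention_check": ["agree", "disagree", "agree"],    # expected answer: "agree"
    "q1": [4, 3, 2], "q2": [5, 3, 4], "q3": [4, 3, 1],    # Likert-scale items
})

items = ["q1", "q2", "q3"]

flags = pd.DataFrame({
    "participant": responses["participant"],
    # Finished implausibly fast (possible mindless clicking).
    "too_fast": responses["seconds_taken"] < 60,
    # Failed the embedded attention-check question.
    "failed_check": responses["attention_check"] != "agree",
    # Gave the same answer to every item ("straight-lining").
    "straight_lined": responses[items].nunique(axis=1) == 1,
})

# Flagged responses are set aside for manual spot-checking, not deleted automatically.
check_cols = ["too_fast", "failed_check", "straight_lined"]
print(flags[flags[check_cols].any(axis=1)])
```

Automated flags like these are best treated as prompts for a human reviewer rather than grounds for exclusion on their own.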

Citizen science as a route to engagement

On that note, with a solid design, citizen science can work well as a route to “public engagement”, often more so than crowdsourcing. As mentioned earlier, you may choose to include both technical training and wider-reaching activities to help participants develop scientific literacy. If you’re thinking about how to incorporate citizen science into your research, or wondering whether another form of engagement would better suit the aims you have in mind, it is worth having a look at the relevant research support pages, and getting in touch with the Public Engagement team (publicengagement@brookes.ac.uk).

UC Berkeley Human Research Protection Program guidance: Mechanical Turk (MTurk) for Online Research, https://cphs.berkeley.edu/mechanicalturk.pdf

SAS citizen science project: https://waterquality.sas.org.uk/england/

Irwin, A. 1995. Citizen Science: A Study of People, Expertise and Sustainable Development. London: Routledge.

Bassi, H., Misener, L., & Johnson, A. M. (2020). Crowdsourcing for Research: Perspectives From a Delphi Panel. SAGE Open, 10(4). https://doi.org/10.1177/2158244020980751

Moss, A. J., Rosenzweig, C., Robinson, J., Jaffe, S. N., & Litman, L. (2020). Is it Ethical to Use Mechanical Turk for Behavioral Research? Relevant Data from a Representative Survey of MTurk Participants and Wages. Center for Open Science. https://doi.org/10.31234/osf.io/jbc9d

Research support pages on public engagement: https://www.brookes.ac.uk/sites/research-support/pen/whyengage

CC BY-SA 4.0 Joe Corneli. Last modified: January 14, 2025.